Search for: All records
Total Resources: 4
Author / Contributor:
- Chew, Joyce A. (3)
- Kazmierczak, Nathanael P. (2)
- Needell, Deanna (2)
- Vander Griend, Douglas A. (2)
- Brouwer, Edward De (1)
- Chew, Joyce A (1)
- Drees, Zachary D. (1)
- Huang, Longxiu (1)
- Jarman, Benjamin (1)
- Johnson, David R (1)
- Kim, Seong Eun (1)
- Krishnaswamy, Smita (1)
- Li, Pengyu (1)
- Michmerhuizen, Anna R. (1)
- Perlmutter, Michael (1)
- Rylaarsdam, Andrew (1)
- Thong, Tasha (1)
- Tseng, Christine (1)
- Van Laar, Luke (1)
- Viswanath, Siddharth (1)
Abstract: In order to better understand manifold neural networks (MNNs), we introduce Manifold Filter-Combine Networks (MFCNs). Our filter-combine framework parallels the popular aggregate-combine paradigm for graph neural networks (GNNs) and naturally suggests many interesting families of MNNs which can be interpreted as manifold analogues of various popular GNNs. We propose a method for implementing MFCNs on high-dimensional point clouds that relies on approximating an underlying manifold by a sparse graph. We then prove that our method is consistent in the sense that it converges to a continuum limit as the number of data points tends to infinity, and we numerically demonstrate its effectiveness on real-world and synthetic data sets. (Free, publicly accessible full text available December 1, 2026.)
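As a concrete illustration of the pipeline this abstract describes, here is a minimal sketch of a single filter-combine layer on a point cloud, assuming the underlying manifold is approximated by a sparse k-nearest-neighbor graph whose normalized Laplacian stands in for the manifold's Laplace-Beltrami operator. The function names and the simple polynomial low-pass filter are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of one "filter-combine" layer on a point cloud.
# The sparse k-NN graph Laplacian approximates the manifold operator;
# names (knn_graph_laplacian, filter_combine_layer) are illustrative.

import numpy as np
from scipy.sparse import identity
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph


def knn_graph_laplacian(points, k=10):
    """Sparse symmetric k-NN adjacency and its normalized graph Laplacian."""
    adj = kneighbors_graph(points, n_neighbors=k, mode="connectivity")
    adj = 0.5 * (adj + adj.T)            # symmetrize
    return laplacian(adj, normed=True)


def filter_combine_layer(laplacian_op, features, combine_weights,
                         n_filter_steps=3, t=0.5):
    """One filter-combine step:
    1. Filter: smooth each channel with a polynomial of the Laplacian,
       here (I - t*L)^n, a crude low-pass spectral filter.
    2. Combine: mix the filtered channels with a weight matrix.
    3. Apply a pointwise nonlinearity (ReLU).
    """
    n = laplacian_op.shape[0]
    smoother = identity(n) - t * laplacian_op
    filtered = features
    for _ in range(n_filter_steps):           # filter step
        filtered = smoother @ filtered
    combined = filtered @ combine_weights      # combine step
    return np.maximum(combined, 0.0)           # pointwise activation


# Usage: 500 points sampled on the 2-sphere in R^3, 4 input channels -> 8 outputs.
rng = np.random.default_rng(0)
points = rng.normal(size=(500, 3))
points /= np.linalg.norm(points, axis=1, keepdims=True)
L = knn_graph_laplacian(points, k=10)
X = rng.normal(size=(500, 4))
W = rng.normal(size=(4, 8)) / np.sqrt(4)
out = filter_combine_layer(L, X, W)
print(out.shape)  # (500, 8)
```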
Kazmierczak, Nathanael P.; Chew, Joyce A.; Vander Griend, Douglas A. (Analytica Chimica Acta)
Li, Pengyu; Tseng, Christine; Zheng, Yaxuan; Chew, Joyce A.; Huang, Longxiu; Jarman, Benjamin; Needell, Deanna (Algorithms)
Abstract: Classification and topic modeling are popular techniques in machine learning that extract information from large-scale datasets. By incorporating a priori information such as labels or important features, methods have been developed to perform classification and topic modeling tasks; however, most methods that can perform both do not allow for guidance of the topics or features. In this paper, we propose a novel method, namely Guided Semi-Supervised Non-negative Matrix Factorization (GSSNMF), that performs both classification and topic modeling by incorporating supervision from both pre-assigned document class labels and user-designed seed words. We test the performance of this method on legal documents provided by the California Innocence Project and the 20 Newsgroups dataset. Our results show that the proposed method improves both classification accuracy and topic coherence in comparison to past methods such as Semi-Supervised Non-negative Matrix Factorization (SSNMF), Guided Non-negative Matrix Factorization (Guided NMF), and Topic Supervised NMF.
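To make the two kinds of supervision in this abstract concrete, below is a minimal sketch of a guided, semi-supervised nonnegative factorization: the data term ||X - WH||^2 is coupled to a label term ||Y - BH||^2 and a seed-word term ||S - WC||^2, minimized with standard multiplicative updates. The formulation and the names (guided_ssnmf, lam, mu) are illustrative assumptions, not the exact GSSNMF algorithm from the paper.

```python
# Hypothetical guided semi-supervised NMF sketch (not the paper's code).
# Minimizes ||X - WH||^2 + lam*||Y - BH||^2 + mu*||S - WC||^2
# over nonnegative W, H, B, C with Lee-Seung-style multiplicative updates.

import numpy as np

EPS = 1e-10


def guided_ssnmf(X, Y, S, n_topics, lam=1.0, mu=1.0, n_iter=200, seed=0):
    """X: (words x docs) counts, Y: (classes x docs) label indicators,
    S: (words x seed_sets) seed-word indicators. Returns W, H, B, C."""
    rng = np.random.default_rng(seed)
    n_words, n_docs = X.shape
    n_classes, n_seed_sets = Y.shape[0], S.shape[1]
    W = rng.random((n_words, n_topics))      # topics over words
    H = rng.random((n_topics, n_docs))       # topic weights per document
    B = rng.random((n_classes, n_topics))    # topics -> class labels
    C = rng.random((n_topics, n_seed_sets))  # topics -> seed sets

    for _ in range(n_iter):
        W *= (X @ H.T + mu * S @ C.T) / (W @ (H @ H.T) + mu * W @ (C @ C.T) + EPS)
        H *= (W.T @ X + lam * B.T @ Y) / ((W.T @ W) @ H + lam * (B.T @ B) @ H + EPS)
        B *= (Y @ H.T) / (B @ (H @ H.T) + EPS)
        C *= (W.T @ S) / ((W.T @ W) @ C + EPS)
    return W, H, B, C


# Toy usage: 30 words, 20 documents, 2 classes, 3 topics, one seed-word set.
rng = np.random.default_rng(1)
X = rng.random((30, 20))
Y = np.zeros((2, 20))
Y[0, :10] = 1
Y[1, 10:] = 1
S = np.zeros((30, 1))
S[[0, 3, 7], 0] = 1          # seed words steering one topic
W, H, B, C = guided_ssnmf(X, Y, S, n_topics=3)
print(W.shape, H.shape)      # (30, 3) (3, 20)
```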
Kazmierczak, Nathanael P.; Chew, Joyce A.; Michmerhuizen, Anna R.; Kim, Seong Eun; Drees, Zachary D.; Rylaarsdam, Andrew; Thong, Tasha; Van Laar, Luke; Vander Griend, Douglas A. (Journal of Chemometrics)
